Given an m × n matrix A of rank r, a rank decomposition or rank factorization of A is a product A = CF, where C is an m × r matrix and F is an r × n matrix.

Every finite-dimensional matrix has a rank decomposition: Let A be an m × n matrix whose column rank is r. Then there are r linearly independent columns in A; equivalently, the dimension of the column space of A is r. Let c_1, c_2, ..., c_r be any basis for the column space of A and place these vectors as the columns of the m × r matrix C = [c_1, c_2, ..., c_r]. Every column of A is therefore a linear combination of the columns of C. To be precise, if A = [a_1, a_2, ..., a_n] with a_j as the j-th column, then

:<math>a_j = f_{1j} c_1 + f_{2j} c_2 + \cdots + f_{rj} c_r,</math>

where the f_{ij} are the scalar coefficients of a_j in terms of the basis c_1, c_2, ..., c_r. This means that A = CF, where F is the r × n matrix whose (i, j)-th entry is f_{ij}.

== rank(A) = rank(A^T) ==

An immediate consequence of rank factorization is that the rank of A is equal to the rank of its transpose A^T. Since the columns of A are the rows of A^T, this says that the column rank of A equals its row rank.

Proof: To see why this is true, first take rank to mean column rank. Since A = CF, it follows that A^T = F^T C^T. From the definition of matrix multiplication, this means that each column of A^T is a linear combination of the columns of F^T. Therefore, the column space of A^T is contained in the column space of F^T and, hence, rank(A^T) ≤ rank(F^T). Now, F^T is an n × r matrix, so it has r columns and, hence, rank(A^T) ≤ r = rank(A). This proves that rank(A^T) ≤ rank(A). Applying the same result to A^T gives the reverse inequality: since (A^T)^T = A, we can write rank(A) = rank((A^T)^T) ≤ rank(A^T). We have therefore proved rank(A^T) ≤ rank(A) and rank(A) ≤ rank(A^T), so rank(A) = rank(A^T). (See also the first proof that column rank equals row rank in the article on matrix rank.)

== Rank factorization from reduced row echelon forms ==

In practice, one specific rank factorization can be constructed as follows: compute B, the reduced row echelon form of A. Then C is obtained by removing from A all non-pivot columns, and F is obtained by eliminating all zero rows of B.
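A small worked example of this construction (the matrix below is chosen purely for illustration and is not from the source): for

:<math>A = \begin{pmatrix} 1 & 2 & 1 \\ 2 & 4 & 3 \\ 3 & 6 & 4 \end{pmatrix}, \qquad B = \begin{pmatrix} 1 & 2 & 0 \\ 0 & 0 & 1 \\ 0 & 0 & 0 \end{pmatrix},</math>

the pivot columns of B are the first and the third, so keeping those columns of A and the nonzero rows of B gives

:<math>C = \begin{pmatrix} 1 & 1 \\ 2 & 3 \\ 3 & 4 \end{pmatrix}, \qquad F = \begin{pmatrix} 1 & 2 & 0 \\ 0 & 0 & 1 \end{pmatrix}, \qquad CF = A.</math>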
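The same construction can be written as a short computational sketch. The following is a minimal illustration assuming SymPy is available; the function name rank_factorization and the test matrix are illustrative choices, not part of the source.

<syntaxhighlight lang="python">
from sympy import Matrix

def rank_factorization(A: Matrix):
    """Return (C, F) with A = C*F, where C is m x r and F is r x n (r = rank of A)."""
    B, pivot_cols = A.rref()       # B: reduced row echelon form, pivot_cols: pivot column indices
    r = len(pivot_cols)            # rank of A
    C = A[:, list(pivot_cols)]     # keep only the pivot columns of A
    F = B[:r, :]                   # keep only the nonzero rows of B
    return C, F

A = Matrix([[1, 2, 1],
            [2, 4, 3],
            [3, 6, 4]])
C, F = rank_factorization(A)
assert C * F == A                  # the factorization reproduces A
assert A.rank() == A.T.rank()      # rank(A) = rank(A^T), as proved above
</syntaxhighlight>

For the rank-2 matrix above this returns a 3 × 2 matrix C and a 2 × 3 matrix F, matching the worked example.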